New Riemannian Priors on the Univariate Normal Model

Authors

  • Salem Said
  • Lionel Bombrun
  • Yannick Berthoumieu
Abstract

The current paper introduces new prior distributions on the univariate normal model, with the aim of applying them to the classification of univariate normal populations. These new prior distributions are entirely based on the Riemannian geometry of the univariate normal model, so that they can be thought of as "Riemannian priors". Precisely, if {pθ; θ ∈ Θ} is any parametrization of the univariate normal model, the paper considers prior distributions G(θ̄, γ) with hyperparameters θ̄ ∈ Θ and γ > 0, whose density with respect to Riemannian volume is proportional to exp(−d²(θ, θ̄)/2γ²), where d(θ, θ̄) is Rao's Riemannian distance. The distributions G(θ̄, γ) are termed Gaussian distributions on the univariate normal model. The motivation for considering a distribution G(θ̄, γ) is that it gives a geometric representation of a class or cluster of univariate normal populations. Indeed, G(θ̄, γ) has a unique mode θ̄ (precisely, θ̄ is the unique Riemannian center of mass of G(θ̄, γ), as shown in the paper), and its dispersion away from θ̄ is given by γ. Therefore, one thinks of members of the class represented by G(θ̄, γ) as being centered around θ̄ and lying within a typical distance determined by γ. The paper rigorously defines the Gaussian distributions G(θ̄, γ) and describes an algorithm for computing maximum likelihood estimates of their hyperparameters. Based on this algorithm and on the Laplace approximation, it describes how the distributions G(θ̄, γ) can be used as prior distributions for Bayesian classification of large univariate normal populations. In a concrete application to texture image classification, it is shown that this leads to an improvement in performance over the use of conjugate priors.

Entropy 2014, 16, 4016
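As a concrete illustration of the quantities in the abstract: under the parametrization θ = (μ, σ), the Fisher metric ds² = (dμ² + 2dσ²)/σ² makes the univariate normal model a hyperbolic space, and Rao's distance has the well-known closed form d(θ₁, θ₂) = √2 · arccosh(1 + [(μ₁−μ₂)² + 2(σ₁−σ₂)²]/(4σ₁σ₂)). The sketch below (function names are illustrative, not taken from the paper) evaluates this distance and the unnormalized density of G(θ̄, γ):

```python
import math

def rao_distance(mu1, sigma1, mu2, sigma2):
    """Rao's Riemannian distance between N(mu1, sigma1^2) and N(mu2, sigma2^2).

    Derived from the Fisher metric ds^2 = (dmu^2 + 2 dsigma^2) / sigma^2,
    under which the model is (a scaled copy of) the hyperbolic half-plane.
    """
    delta = 1.0 + ((mu1 - mu2) ** 2 + 2.0 * (sigma1 - sigma2) ** 2) / (4.0 * sigma1 * sigma2)
    return math.sqrt(2.0) * math.acosh(delta)

def prior_density_unnormalized(theta, theta_bar, gamma):
    """Unnormalized density of G(theta_bar, gamma) with respect to Riemannian
    volume: exp(-d^2(theta, theta_bar) / (2 gamma^2))."""
    d = rao_distance(theta[0], theta[1], theta_bar[0], theta_bar[1])
    return math.exp(-d ** 2 / (2.0 * gamma ** 2))

# Sanity check: for equal means the geodesic runs along the sigma-axis and the
# distance reduces to sqrt(2) * |log(sigma2 / sigma1)|.
print(round(rao_distance(0.0, 1.0, 0.0, math.e), 6))  # → 1.414214 (= sqrt(2))
```

The density peaks at θ̄ (where d = 0) and decays with the squared geodesic distance, which is what makes θ̄ the mode and γ the dispersion parameter described in the abstract.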


Related articles

Kullback-Leibler Divergence for the Normal-Gamma Distribution

We derive the Kullback-Leibler divergence for the normal-gamma distribution and show that it is identical to the Bayesian complexity penalty for the univariate general linear model with conjugate priors. Based on this finding, we provide two applications of the KL divergence, one in simulated and one in empirical data.


Fast Exact Bayesian Inference for the Hierarchical Normal Model: Solving the Improper Posterior Problem

The hierarchical normal-normal model is considered. Standard empirical Bayes methods underestimate variability because they ignore uncertainty about the hyperparameters; Bayes' theorem solves this problem. We provide fast, exact inference that requires only a simple, univariate numerical integration to obtain the posterior distribution of the means. However, when standard, scale-invariant, vague p...


Umbilicity of (Space-Like) Submanifolds of Pseudo-Riemannian Space Forms

We study umbilic (space-like) submanifolds of pseudo-Riemannian space forms, then define totally semi-umbilic space-like submanifolds of pseudo-Euclidean space and relate this notion to umbilicity. Finally, we give a characterization of total semi-umbilicity for space-like submanifolds contained in a pseudo-sphere, pseudo-hyperbolic space, or the light cone. A pseudo-Riemannian submanifold M in (a...


On a New Bimodal Normal Family

Unimodal distributions are frequently used in theoretical statistical studies. But in applied statistics, there are many situations in which unimodal distributions cannot be fitted to the data. For example, the distribution of data outside the control zone in quality control, or of outlier observations in linear models and time series, may require a bimodal distribution. These situations, oc...


Geometric and Topological Invariants of the Hypothesis Space

The form and shape of a hypothesis space impose natural objective constraints on any inferential process. This contribution summarizes what is currently known and the mathematics thought to be needed for new developments in this area. For example, it is well known that the quality of the best possible estimators deteriorates with increasing volume, dimension, and curvature of the hypothesi...




Journal:
  • Entropy

Volume 16, Issue -

Pages -

Publication year: 2014